Symbolic-regression boosting

Authors

Abstract

Modifying standard gradient boosting by replacing the embedded weak learner with a strong(er) one, we present SyRBo: Symbolic-Regression Boosting. Experiments over 98 regression datasets show that by adding a small number of boosting stages -- between 2 and 5 -- to a symbolic regressor, statistically significant improvements can often be attained. We note that coding SyRBo on top of any symbolic regressor is straightforward, and the added cost is simply a few more evolutionary rounds. SyRBo is essentially a simple add-on that can readily augment an extant symbolic regressor, with beneficial results.
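The boosting scheme the abstract describes, fit a base model, then fit each further stage to the residuals left by the ensemble so far, can be sketched as follows. This is an illustrative sketch, not the authors' code: a one-split regression stump stands in for the (stronger) symbolic-regression base learner that SyRBo actually uses, and the names `Stump`, `boost_fit`, and `boost_predict` are invented for this example.

```python
class Stump:
    """One-split regression tree on a single feature: a deliberately weak
    stand-in for SyRBo's symbolic-regression base learner."""

    def fit(self, xs, ys):
        def sse(vals):
            mu = sum(vals) / len(vals)
            return sum((v - mu) ** 2 for v in vals)

        best = float("inf")
        # Try every split threshold; keep the one minimizing squared error.
        for t in sorted(set(xs))[:-1]:
            left = [y for x, y in zip(xs, ys) if x <= t]
            right = [y for x, y in zip(xs, ys) if x > t]
            err = sse(left) + sse(right)
            if err < best:
                best = err
                self.thr = t
                self.lval = sum(left) / len(left)
                self.rval = sum(right) / len(right)
        return self

    def predict(self, xs):
        return [self.lval if x <= self.thr else self.rval for x in xs]


def boost_fit(xs, ys, n_stages=3):
    """Additive boosting: each stage fits the residuals of the previous ones."""
    models, pred = [], [0.0] * len(ys)
    for _ in range(n_stages):
        m = Stump().fit(xs, [y - p for y, p in zip(ys, pred)])
        pred = [p + q for p, q in zip(pred, m.predict(xs))]
        models.append(m)
    return models


def boost_predict(models, xs):
    """Sum the stage predictions to get the ensemble output."""
    pred = [0.0] * len(xs)
    for m in models:
        pred = [p + q for p, q in zip(pred, m.predict(xs))]
    return pred
```

With a genetic-programming regressor substituted for `Stump`, the extra cost per stage is exactly the "few more evolutionary rounds" the abstract mentions; the paper's finding is that 2 to 5 such stages often yield statistically significant gains.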


Similar resources

Outlier Detection by Boosting Regression Trees

A procedure for detecting outliers in regression problems is proposed. It is based on information provided by boosting regression trees. The key idea is to select the most frequently resampled observation along the boosting iterations and reiterate after removing it. The selection criterion is based on Tchebychev’s inequality applied to the maximum over the boosting iterations of ...

Boosting for Regression Transfer

The goal of transfer learning is to improve the learning of a new target concept given knowledge of related source concept(s). We introduce the first boosting-based algorithms for transfer learning that apply to regression tasks. First, we describe two existing classification transfer algorithms, ExpBoost and TrAdaBoost, and show how they can be modified for regression. We then introduce extens...

Boosting Regression via Classification

Boosting strategies are methods of improving the accuracy of a prediction (a classification rule) by combining many weaker predictions, each of which is only moderately accurate. In this paper we present a concise analysis of Freund and Schapire's AdaBoost algorithm [FS97] from which we derive a new boosting strategy for the regression case which is an extension of the algorithm discussed in...

On boosting kernel regression

In this paper we propose a simple multistep regression smoother which is constructed in an iterative manner, by learning the Nadaraya-Watson estimator with L2-boosting. We find, in both theoretical analysis and simulation experiments, that the bias converges exponentially fast, and the variance diverges exponentially slowly. The first boosting step is analyzed in more detail, giving asymptotic exp...

Boosting Regression Estimators

There is interest in extending the boosting algorithm (Schapire, 1990) to fit a wide range of regression problems. The threshold-based boosting algorithm for regression used an analogy between classification errors and big errors in regression. We focus on the practical aspects of this algorithm and compare it to other attempts to extend boosting to regression. The practical capabilities of thi...


Journal

Journal title: Genetic Programming and Evolvable Machines

Year: 2021

ISSN: 1389-2576, 1573-7632

DOI: https://doi.org/10.1007/s10710-021-09400-0